AI safety

AI safety is an interdisciplinary field concerned with preventing accidents, misuse, and other harmful consequences that could result from artificial intelligence (AI) systems. It encompasses machine ethics and AI alignment, which aim to make AI systems moral and beneficial, as well as monitoring AI systems for risks and making them highly reliable. Beyond AI research, it involves developing norms and policies that promote safety.

